There are close parallels between the mathematical expressions for the thermodynamic entropy, usually denoted by ''S'', of a physical system in the statistical thermodynamics established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s, and the information-theoretic entropy, usually expressed as ''H'', of Claude Shannon and Ralph Hartley, developed in the 1940s. Shannon, although not initially aware of this similarity, commented on it upon publicizing information theory in ''A Mathematical Theory of Communication''. This article explores the links between the two concepts and the extent to which they can be regarded as connected.

==Equivalence of form of the defining expressions==

The defining expression for entropy in the theory of statistical mechanics established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s is of the form:

: <math>S = -k_\text{B} \sum_i p_i \ln p_i,</math>

where <math>p_i</math> is the probability of the microstate ''i'' taken from an equilibrium ensemble, and <math>k_\text{B}</math> is the Boltzmann constant.

The defining expression for entropy in the theory of information established by Claude E. Shannon in 1948 is of the form:

: <math>H = -\sum_i p_i \log_b p_i,</math>

where <math>p_i</math> is the probability of the message <math>m_i</math> taken from the message space ''M'', and ''b'' is the base of the logarithm used. Common values of ''b'' are 2, ''e'', and 10, and the corresponding unit of entropy is the bit for ''b'' = 2, the nat for ''b'' = ''e'', and the dit (or digit) for ''b'' = 10.〔Schneider, T.D., ''Information theory primer with an appendix on logarithms'', National Cancer Institute, 14 April 2007.〕

Mathematically, ''H'' may also be seen as an average information, taken over the message space, because when a certain message occurs with probability ''p''<sub>''i''</sub>, the quantity of information <math>-\log(p_i)</math> is obtained.

If all the microstates are equiprobable (a microcanonical ensemble), the statistical thermodynamic entropy reduces to the form given by Boltzmann:

: <math>S = k_\text{B} \ln W,</math>

where ''W'' is the number of microstates. If all the messages are equiprobable, the information entropy reduces to the Hartley entropy

: <math>H = \log_b |M|,</math>

where <math>|M|</math> is the cardinality of the message space ''M''.

The logarithm in the thermodynamic definition is the natural logarithm. It can be shown that the Gibbs entropy formula, with the natural logarithm, reproduces all of the properties of the macroscopic classical thermodynamics of Clausius. (See article: Entropy (statistical views).)

The logarithm can also be taken to the natural base in the case of information entropy. This is equivalent to choosing to measure information in nats instead of the usual bits. In practice, information entropy is almost always calculated using base-2 logarithms, but this distinction amounts to nothing other than a change in units. One nat is about 1.44 bits.

The presence of Boltzmann's constant ''k'' in the thermodynamic definition is a historical accident, reflecting the conventional units of temperature. It is there to make sure that the statistical definition of thermodynamic entropy matches the classical entropy of Clausius, thermodynamically conjugate to temperature. For a simple compressible system that can only perform volume work, the first law of thermodynamics becomes

: <math>\mathrm{d}U = T\,\mathrm{d}S - p\,\mathrm{d}V.</math>

But one can equally well write this equation in terms of what physicists and chemists sometimes call the 'reduced' or dimensionless entropy, σ = ''S''/''k'', so that

: <math>\mathrm{d}U = k_\text{B} T\,\mathrm{d}\sigma - p\,\mathrm{d}V.</math>

Just as ''S'' is conjugate to ''T'', so σ is conjugate to ''k''<sub>B</sub>''T'' (the energy that is characteristic of ''T'' on a molecular scale).
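As a concrete illustration of the change-of-units point above, the following Python sketch computes the same Shannon entropy in bits and in nats, shows the reduction to the Hartley form (and, for microstates, the Boltzmann form) when all outcomes are equiprobable, and multiplies the nat-valued entropy by Boltzmann's constant to obtain a thermodynamic entropy in J/K. The four-outcome distribution and the value ''W'' = 8 are assumptions chosen purely for illustration.

<syntaxhighlight lang="python">
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H = -sum_i p_i log_b(p_i) of a discrete distribution."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)

# Illustrative (assumed) four-outcome message distribution.
p = [0.5, 0.25, 0.125, 0.125]

H_bits = shannon_entropy(p, base=2)        # entropy measured in bits
H_nats = shannon_entropy(p, base=math.e)   # the same entropy measured in nats

print(f"H = {H_bits:.4f} bits = {H_nats:.4f} nats")
print(f"1 nat = {1 / math.log(2):.4f} bits")   # ~1.4427 bits per nat

# Equiprobable case: H reduces to the Hartley entropy log_b |M|,
# just as the Gibbs entropy reduces to Boltzmann's k_B ln W.
W = 8                                     # assumed number of equiprobable outcomes
uniform = [1.0 / W] * W
print(f"Hartley entropy for |M| = {W}: {shannon_entropy(uniform, 2):.1f} bits")

# Treating the same uniform distribution as W equiprobable microstates,
# the thermodynamic entropy is k_B times the entropy expressed in nats:
S = K_B * shannon_entropy(uniform, base=math.e)   # equals k_B ln W
print(f"S = k_B ln {W} = {S:.3e} J/K")
</syntaxhighlight>

Running the sketch gives ''H'' = 1.75 bits ≈ 1.213 nats for the assumed distribution, and ''S'' = ''k''<sub>B</sub> ln 8 ≈ 2.87 × 10<sup>−23</sup> J/K for the equiprobable case, making explicit that the two entropies differ only by the choice of logarithm base and the factor ''k''<sub>B</sub>.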